de Bruijn identities: from Shannon, Kullback–Leibler and Fisher to generalized φ-entropies, φ-divergences and φ-Fisher informations
Authors
Abstract
In this paper we propose a generalization of the usual de Bruijn identity that links the Shannon differential entropy (or the Kullback–Leibler divergence) and the Fisher information (or the Fisher divergence) of the output of a Gaussian channel. The generalization makes use of φ-entropies on the one hand, and of φ-divergences (of the Csiszár class) on the other hand, as generalizations of the Shannon entropy and of the Kullback–Leibler divergence respectively. The generalized de Bruijn identities induce the definition of generalized Fisher informations and generalized Fisher divergences; some of these generalizations already exist in the literature. Moreover, we provide results that go beyond the Gaussian channel: we are then able to characterize a noisy channel using general measures of mutual information, both for Gaussian and non-Gaussian channels.
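For reference, the classical identity being generalized here is a standard fact (the notation below is illustrative and not taken verbatim from the paper): if Z is a standard Gaussian independent of X, then
\[
\frac{\mathrm{d}}{\mathrm{d}t}\, h\!\left(X + \sqrt{t}\,Z\right)
  \;=\; \frac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right),
\qquad
J(Y) \;=\; \int \frac{\bigl(p_Y'(y)\bigr)^{2}}{p_Y(y)}\, \mathrm{d}y,
\]
where h is the Shannon differential entropy and J the nonparametric Fisher information of the channel output X + √t Z. A companion statement holds for relative quantities: when two inputs are passed through the same Gaussian channel, the time derivative of their Kullback–Leibler divergence equals, up to a factor of −1/2, their Fisher divergence; the paper extends both statements to φ-entropies and φ-divergences.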
Similar references
Generalization of the de Bruijn's identity to general φ-entropies and φ-Fisher informations
In this paper, we propose generalizations of the de Bruijn's identities based on extensions of the Shannon entropy, Fisher information and their associated divergences or relative measures. The foundations of these generalizations are the φ-entropies and divergences of the Csiszár class (or Salicrú class), considered within a multidimensional context that includes the one-dimensional case, and fo...
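As a reminder of the objects involved (standard definitions, with notation chosen here only for illustration): for a convex function φ with φ(1) = 0, the Csiszár φ-divergence between densities p and q, and the associated φ-entropy, can be written as
\[
D_\varphi(p \,\|\, q) \;=\; \int q(x)\, \varphi\!\left(\frac{p(x)}{q(x)}\right) \mathrm{d}x,
\qquad
H_\varphi(p) \;=\; -\int \varphi\bigl(p(x)\bigr)\, \mathrm{d}x,
\]
the choice φ(u) = u log u recovering the Kullback–Leibler divergence and the Shannon differential entropy; the Salicrú class composes the integral with an outer function, H_{(h,φ)}(p) = h(∫ φ(p(x)) dx).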
Observed Diffusion Processes
In this paper we propose the use of φ-divergences as test statistics to verify simple hypotheses about a one-dimensional parametric diffusion process dX_t = b(X_t, θ)dt + σ(X_t, θ)dW_t, from discrete observations {X_{t_i}, i = 0, ..., n} with t_i = iΔ_n, i = 0, 1, ..., n, under the asymptotic scheme Δ_n → 0, nΔ_n → ∞ and nΔ_n² → 0. The class of φ-divergences is wide and includes several special member...
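For context, a standard device in this discrete-observation setting (not specific to the cited paper) is the Euler–Maruyama approximation of the transition between consecutive observations, which makes the increments conditionally Gaussian and hence yields tractable approximate likelihoods to compare through φ-divergences:
\[
X_{t_{i+1}} \;\approx\; X_{t_i} + b(X_{t_i},\theta)\,\Delta_n
  + \sigma(X_{t_i},\theta)\,\sqrt{\Delta_n}\;\varepsilon_i,
\qquad \varepsilon_i \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0,1).
\]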
Phi-Divergence Constrained Ambiguous Stochastic Programs for Data-Driven Optimization
This paper investigates the use of φ-divergences in ambiguous (or distributionally robust) two-stage stochastic programs. Classical stochastic programming assumes that the distribution of the uncertain parameters is known. However, the true distribution is unknown in many applications. Especially in cases where there is little data or little trust in the data, an ambiguity set of distributions can be...
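Schematically, and with symbols introduced here only for illustration, such an ambiguous two-stage program replaces the expectation under a nominal (e.g. empirical) distribution P̂ by a worst case over a φ-divergence ball of radius ρ ≥ 0:
\[
\min_{x \in X}\; c^{\top}x \;+\; \sup_{Q\,:\, D_\varphi(Q \,\|\, \hat P)\,\le\,\rho}\; \mathbb{E}_{Q}\bigl[h(x,\xi)\bigr],
\]
where h(x, ξ) denotes the optimal value of the second-stage problem for first-stage decision x and realization ξ of the uncertain parameters.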
Jensen divergence based on Fisher's information
The measure of Jensen-Fisher divergence between probability distributions is introduced and its theoretical grounds are set up. This quantity, in contrast to other Jensen divergences, is very sensitive to fluctuations of the probability distributions because it is controlled by the (local) Fisher information, which is a gradient functional of the distribution. So, it is appropriate and ...
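In the spirit of the Jensen–Shannon construction, the quantity can be sketched (for two densities with equal weights; notation illustrative) as
\[
\mathrm{JF}(p_1,p_2) \;=\; \frac{I(p_1)+I(p_2)}{2} \;-\; I\!\left(\frac{p_1+p_2}{2}\right),
\qquad
I(p) \;=\; \int \frac{\bigl(p'(x)\bigr)^{2}}{p(x)}\,\mathrm{d}x,
\]
which is nonnegative because the Fisher information functional I is convex in the density.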
Extended inequalities for weighted Renyi entropy involving generalized Gaussian densities
In this paper the author analyzes the weighted Rényi entropy in order to derive several inequalities in the weighted case. Furthermore, using the proposed notions of the α-th generalized deviation and the (α, p)-th weighted Fisher information, extended versions of the moment-entropy, Fisher information and Cramér–Rao inequalities in terms of generalized Gaussian densities are given.
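For orientation, the unweighted Rényi differential entropy of order p reads (standard definition; the weighted variant studied in the paper additionally inserts a weight function into the integral):
\[
h_p(f) \;=\; \frac{1}{1-p}\,\ln \int f(x)^{p}\, \mathrm{d}x,
\qquad p>0,\ p\neq 1,
\]
which recovers the Shannon differential entropy in the limit p → 1.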